Context-Sensitive MTL Networks for Machine Lifelong Learning

Authors

  • Daniel L. Silver
  • Ryan Poirier
Abstract

Context-sensitive Multiple Task Learning, or csMTL, is presented as a method of inductive transfer that uses a single output neural network and additional contextual inputs for learning multiple tasks. The csMTL method is tested on three task domains and shown to produce hypotheses for a primary task that are significantly better than standard MTL hypotheses when learning in the presence of related and unrelated tasks. A new measure of task relatedness, based on the context input weights, is shown to have promise. The paper also outlines a machine lifelong learning system that uses csMTL for sequentially learning multiple tasks. The approach satisfies a number of important requirements for knowledge retention and inductive transfer including the elimination of redundant outputs, representational transfer for rapid but effective short-term learning and functional transfer via task rehearsal for long-term consolidation.
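As an illustration of the architecture the abstract describes, the sketch below shows a csMTL-style network in PyTorch: the primary inputs are concatenated with a one-hot task-context vector and fed to a single-output network shared by all tasks. This is a minimal, assumed implementation; the class name, layer sizes, and activation choices are illustrative and not taken from the paper.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CSMTLNet(nn.Module):
    # Single shared-output network with additional context inputs (csMTL-style sketch).
    def __init__(self, n_inputs: int, n_tasks: int, n_hidden: int = 20):
        super().__init__()
        self.n_tasks = n_tasks
        # Context inputs (one per task) are appended to the primary inputs,
        # so the shared hidden layer sees n_inputs + n_tasks values.
        self.net = nn.Sequential(
            nn.Linear(n_inputs + n_tasks, n_hidden),
            nn.Sigmoid(),
            nn.Linear(n_hidden, 1),   # one output node shared by all tasks
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor, task_id: torch.Tensor) -> torch.Tensor:
        # task_id holds the integer task index for each example in the batch;
        # it is one-hot encoded as the context vector and concatenated with x.
        context = F.one_hot(task_id, num_classes=self.n_tasks).float()
        return self.net(torch.cat([x, context], dim=1))

# Example usage: four examples drawn from three tasks, all sharing one output node.
model = CSMTLNet(n_inputs=10, n_tasks=3)
x = torch.randn(4, 10)
t = torch.tensor([0, 2, 1, 0])
y = model(x, t)   # shape (4, 1)

Because task identity enters the network only through the context inputs, the weights on those inputs give a natural place to compare tasks, which is the basis of the relatedness measure mentioned in the abstract.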


Similar Resources

Machine Life-Long Learning with csMTL Networks

Multiple task learning (MTL) neural networks are one of the better-documented methods of inductive transfer of task knowledge (Caruana 1997). An MTL network is a feedforward multi-layer network with an output node for each task being learned. The standard back-propagation of error learning algorithm is used to train all tasks in parallel. The sharing of internal representation in the hidden nodes...
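For contrast with the csMTL sketch above, the following is a minimal sketch (an assumption, not code from the cited work) of a standard MTL network: a shared hidden layer feeds one output node per task, and back-propagation trains all task outputs in parallel.

import torch
import torch.nn as nn

class MTLNet(nn.Module):
    # Standard MTL sketch: shared hidden layer, one output node per task.
    def __init__(self, n_inputs: int, n_tasks: int, n_hidden: int = 20):
        super().__init__()
        self.shared = nn.Sequential(nn.Linear(n_inputs, n_hidden), nn.Sigmoid())
        self.task_outputs = nn.Linear(n_hidden, n_tasks)  # one output node per task

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # All task outputs are computed from the same shared representation,
        # so each backward pass updates the shared weights for every task.
        return torch.sigmoid(self.task_outputs(self.shared(x)))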


csMTL: a Context Sensitive Lifelong Learning System

csMTL, or context-sensitive Multiple Task Learning, is presented as a method of inductive transfer that uses a single output neural network and additional contextual inputs for learning multiple tasks. The csMTL approach is demonstrated to produce hypotheses that are equivalent to or better than standard MTL hypotheses when learning a primary task in the presence of related and unrelated tasks....


An Overview of Multi-Task Learning in Deep Neural Networks

Multi-task learning (MTL) has led to successes in many applications of machine learning, from natural language processing and speech recognition to computer vision and drug discovery. This article aims to give a general overview of MTL, particularly in deep neural networks. It introduces the two most common methods for MTL in Deep Learning, gives an overview of the literature, and discusses rec...


Personalized Multitask Learning for Predicting Tomorrow’s Mood, Stress, and Health

While accurately predicting mood and wellbeing could have a number of important clinical benefits, traditional machine learning (ML) methods frequently yield low performance in this domain. We posit that this is because a one-size-fits-all machine learning model is inherently ill-suited to predicting outcomes like mood and stress, which vary greatly due to individual differences. Therefore, we ...


Phonetic Context Embeddings for DNN-HMM Phone Recognition

This paper proposes an approach, named phonetic context embedding, to model phonetic context effects for deep neural network hidden Markov model (DNN-HMM) phone recognition. Phonetic context embeddings can be regarded as continuous and distributed vector representations of context-dependent phonetic units (e.g., triphones). In this work they are computed using neural networks. First, all phone ...




Publication date: 2007